The number of deepfakes circulating on the internet has exploded since the term first appeared online in 2017. (Bloomberg)

Governments Taking Action to Control Deepfakes as AI Becomes More Widely Used

The images include Donald Trump embracing and kissing his former chief medical adviser, Dr. Anthony Fauci, pornographic depictions of Hollywood actresses and internet influencers, and a photo of an explosion at the Pentagon.

All were found to be “deepfakes”, highly realistic audio and visual content created using rapidly developing artificial intelligence technology.

Those harmed by digital forgeries — especially women who appear in sexually intimate forgeries without consent — have few options for redress, and lawmakers across the country are now working to fill that gap.

“A pornographic deepfake presented honestly didn’t necessarily violate any existing law,” said Matthew Kugler, a law professor at Northwestern University who sponsored a deepfake bill currently under consideration by the governor in Illinois.

“You’re taking something public, your face, and something that’s completely another person’s, so under a lot of current laws and torts, there wasn’t a clear way to sue people for that,” he said.

Recent interest in the powers of generative artificial intelligence has already inspired several congressional hearings and proposals to regulate the emerging technology this year. But with the federal government deadlocked, state lawmakers have been quicker to advance laws aimed at combating the immediate harms of AI.

Nine states have passed laws regulating deepfakes, mostly in connection with pornography and election interference, and at least four more states have bills in various stages of the legislative process.

California, Texas, and Virginia were the first states to enact deepfake laws in 2019, ahead of the current AI frenzy. Minnesota passed its deepfake law most recently, in May, and a similar bill in Illinois is awaiting the governor’s signature.

“People often talk about the slow, glacial pace of legislation, and this is an area where that’s definitely not true,” said Matthew Ferraro, an attorney at WilmerHale LLP who has followed deepfake laws closely.

Tech Driving Law

The term “deepfakes” first appeared on the Internet in 2017 when a Reddit user began posting fake porn videos that used artificial intelligence algorithms to digitally insert celebrity faces into real adult videos without permission.

Earlier this year, the proliferation of non-consensual pornographic deepfakes sparked controversy in the video game streaming community and highlighted the harms of unfettered deepfakes and the lack of legal recourse available to victims. Popular streamer QTCinderella, who said she was harassed by internet users sending her the images, had threatened to sue the people behind the deepfakes, but was later told by lawyers that she had no case.

The number of deepfakes circulating on the internet has exploded since then. Deeptrace Labs, a deepfake detection service, released a widely read report in 2019 that identified nearly 15,000 deepfake videos online, 96% of which were pornographic content featuring women. Sensity AI, which also detects deepfakes, said deepfake videos have grown exponentially since 2018.

“Technology is constantly improving, so it’s very difficult, unless you’re an expert in digital forensics, to tell if something is fake or not,” said Loyola Marymount University law professor Rebecca Delfino, a deepfake researcher.

Deepfakes have also fueled the spread of misinformation online and in political campaigns. An attack ad from GOP presidential candidate Ron DeSantis appeared to show Trump embracing Fauci in a series of images, but some of the images were created by artificial intelligence.

A fake but realistic photo that went viral on Twitter in May showed an explosion at the Pentagon, leading to a temporary drop in the stock market.

In one sense, synthetic media has existed for decades with basic photo manipulation techniques and more recently with programs like Photoshop. But the ease with which non-technical internet users can now create highly realistic digital fakes has led to new laws.

“That speed, that scale, that credibility and the availability of that technology have all kind of brought this witch’s brew together,” Ferraro said.

Finding Remedies

Without a law specifically targeting pornographic deepfakes, victims have limited legal recourse. A hodgepodge of intellectual property, privacy, and defamation laws could, in theory, allow a victim to sue or otherwise seek redress.

A federal court in Los Angeles is currently hearing a right-of-publicity lawsuit from a reality TV celebrity who said she never gave permission to an artificial intelligence app that lets users digitally superimpose their faces onto hers. But publicity laws, which vary from state to state, protect an image only when it is used for commercial purposes.

Forty-eight states have a criminal ban on the distribution of revenge porn, and some have laws against “upskirting,” photographing another person’s private parts without permission. A victim can also sue for defamation, but those laws wouldn’t necessarily apply if the deepfake is labeled as a fake, said Kugler, the Northwestern law professor.

Caroline Ford, a lawyer at Minc Law who specializes in helping victims of revenge porn, said that while many victims could get relief under these laws, the rules were not written with deepfakes in mind.

“A provision that makes it very clear to the courts that the Legislature sees a serious harm here and is trying to remedy that harm is always better in these situations,” she said.

State Patchwork

State laws have varied to date.

In Hawaii, Texas, Virginia, and Wyoming, pornographic deepfakes are addressed only through criminal penalties, while New York and California laws only create a private right of action that allows victims to file civil lawsuits. The new Minnesota law provides both criminal and civil penalties.

Finding the right party to sue can be difficult, and local law enforcement isn’t always cooperative, said Ford, who has handled revenge porn cases. Many of her clients just want the pictures or videos removed and don’t have the resources to sue.

The definition of a deepfake also varies from state to state. Some, like Texas, refer directly to artificial intelligence, while others just use language like “computer-generated image” or “digitization.”

Many of these states have simultaneously changed their election laws to prohibit deepfakes in campaign ads in the period leading up to an election.

Free Speech Concerns

Like most new technologies, deepfakes can be used for harmless purposes: making parodies, reviving historical figures, or dubbing movies, all of which are activities protected by the First Amendment.

Finding a balance that prohibits harmful deepfakes while protecting legitimate ones is not easy. “You can see that policymakers are really struggling,” said Delfino, the Loyola law professor.

The ACLU of Illinois initially opposed the state’s pornographic deepfakes law, arguing that while deepfakes can cause real harm, the bill’s broad provisions and its immediate takedown clause could “chill or silence a large amount of protected speech.”

Recent amendments changed the bill to add deepfakes to Illinois’ existing revenge porn law, a “significant improvement,” Ed Yohnka, the organization’s director of communications, said in an email. “We remain concerned that the language lowers existing statutory thresholds,” he said.

Delfino said the deepfake bill introduced in Congress last month could raise similar concerns because its exemptions are limited to “legitimate public concerns.”

She noted that the California statute contains express references to First Amendment protections. If Congress wants to “really take this seriously, they’re going to have to do a little more work on this proposal,” she said.

Kugler said the first deepfake laws have mostly targeted non-consensual pornography because those cases are the “low-hanging fruit” of free speech issues. The emotional distress and damage to dignity and reputation are clear, while the benefits of free speech are few, he said.

Delfino has long advocated for stronger revenge porn laws and has followed the rise of deepfake porn since it first gained attention. She said she’s glad the new interest in artificial intelligence is generally pushing for stricter laws.

“Like a lot of things involving crimes against women and the objectification of women and minorities, they get attention every now and then, and then the public kind of moves on,” she said. “But now people are coming back and worrying about deepfake techniques again.”
